Conversation

@ZachCafego (Contributor) commented Feb 20, 2025

@jrobble (Member) left a comment


Reviewable status: 0 of 1 files reviewed, 2 unresolved discussions


docker-compose.components.yml line 159 at r1 (raw file):

  
  llava-detection-server:
    <<: *detection-component-base

This is not a component and should not extend from the component base.


docker-compose.components.yml line 167 at r1 (raw file):

      memlock: -1
      stack: 67108864
    ports:

This is for exposing ports so that the server can be accessed from outside of Docker. Since only your LLAVA component service needs to access it, I think this is unnecessary and should be removed.

@ZachCafego (Contributor, Author) left a comment


Reviewable status: 0 of 1 files reviewed, 2 unresolved discussions (waiting on @jrobble)


docker-compose.components.yml line 159 at r1 (raw file):

Previously, jrobble (Jeff Robble) wrote…

This is not a component and should not extend from the component base.

Done.


docker-compose.components.yml line 167 at r1 (raw file):

Previously, jrobble (Jeff Robble) wrote…

This is for exposing ports so that the server can be accessed from outside of Docker. Since only your LLAVA component service needs to access it, I think this is unnecessary and should be removed.

Done.

@jrobble (Member) commented Mar 14, 2025

docker-compose.components.yml line 167 at r2 (raw file):

      stack: 67108864
    environment:
      NVIDIA_VISIBLE_DEVICES: all

Making a note to consider adding:

OLLAMA_KEEP_ALIVE=-1

to address losing GPU connectivity when ollama unloads the model after inactivity.

Issue mentioned here: ollama/ollama#6928 (comment)
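For reference, a sketch of what that addition might look like in the server's environment block (the service and variable names beyond those quoted above are assumptions based on the snippets in this review):

```yaml
  llava-detection-server:
    environment:
      NVIDIA_VISIBLE_DEVICES: all
      # Keep the model loaded indefinitely so GPU connectivity is not
      # lost when ollama would otherwise unload it after a period of
      # inactivity (-1 disables the idle unload timeout).
      OLLAMA_KEEP_ALIVE: -1
```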

@jrobble (Member) commented Jun 18, 2025

docker-compose.components.yml line 159 at r2 (raw file):

  
  llava-detection-server:
    image: ${REGISTRY}ollama_server:${TAG}

Change the image name to openmpf_llava_detection_server.
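With that rename applied, the quoted lines would read (assuming the same REGISTRY and TAG variables already used in the file):

```yaml
  llava-detection-server:
    image: ${REGISTRY}openmpf_llava_detection_server:${TAG}
```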
